
    PynPoint: a modular pipeline architecture for processing and analysis of high-contrast imaging data

    The direct detection and characterization of planetary and substellar companions at small angular separations is a rapidly advancing field. Dedicated high-contrast imaging instruments deliver unprecedented sensitivity, enabling detailed insights into the atmospheres of young low-mass companions. In addition, improvements in data reduction and PSF subtraction algorithms are equally relevant for maximizing the scientific yield, both from new and from archival data sets. We aim to develop a generic and modular data reduction pipeline for processing and analysis of high-contrast imaging data obtained with pupil-stabilized observations. The package should be scalable and robust for future implementations and, in particular, well suited for the 3-5 micron wavelength range, where typically (ten) thousands of frames have to be processed and an accurate subtraction of the thermal background emission is critical. PynPoint is written in Python 2.7 and applies various image processing techniques, as well as statistical tools for analyzing the data, building on open-source Python packages. The current version of PynPoint has evolved from an earlier version that was developed as a PSF subtraction tool based on PCA. The architecture of PynPoint has been redesigned, with the core functionalities decoupled from the pipeline modules. Modules have been implemented for dedicated processing and analysis steps, including background subtraction, frame registration, PSF subtraction, photometric and astrometric measurements, and estimation of detection limits. The pipeline package enables end-to-end data reduction of pupil-stabilized data and supports classical dithering and coronagraphic data sets. As an example, we processed archival VLT/NACO L' and M' data of beta Pic b, reassessed the planet's brightness and position with an MCMC analysis, and provide a derivation of the photometric error budget. Comment: 16 pages, 9 figures, accepted for publication in A&A; PynPoint is available at https://github.com/PynPoint/PynPoint
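
    PynPoint's PSF subtraction is built around principal component analysis of the stack of pupil-stabilized frames. As a rough illustration of the underlying idea only (a minimal NumPy sketch, not PynPoint's actual API; array shapes, the rotation convention, and the number of components are placeholder assumptions):

```python
import numpy as np
from scipy.ndimage import rotate

def pca_psf_subtraction(cube, angles, n_components=5):
    """Toy full-frame PCA PSF subtraction for an ADI cube.

    cube   : (n_frames, ny, nx) pupil-stabilized images
    angles : (n_frames,) parallactic angles in degrees
    Returns the residual image after derotation and median combination.
    """
    n_frames, ny, nx = cube.shape
    data = cube.reshape(n_frames, -1)

    # Subtract the temporal mean of every pixel
    data_c = data - data.mean(axis=0)

    # Principal components of the stellar PSF via SVD
    _, _, vt = np.linalg.svd(data_c, full_matrices=False)
    basis = vt[:n_components]                 # (k, ny*nx)

    # Project each frame onto the basis and subtract the PSF model
    coeffs = data_c @ basis.T                 # (n_frames, k)
    residuals = (data_c - coeffs @ basis).reshape(n_frames, ny, nx)

    # Derotate to a common sky orientation and median-combine
    derotated = [rotate(res, -ang, reshape=False, order=1)
                 for res, ang in zip(residuals, angles)]
    return np.median(derotated, axis=0)
```

    In the pipeline itself this step would be one module among several, with reading, background subtraction, and frame registration handled by separate modules.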

    Comparing Apples with Apples: Robust Detection Limits for Exoplanet High-Contrast Imaging in the Presence of non-Gaussian Noise

    Over the past decade, hundreds of nights have been spent on the world's largest telescopes to search for and directly detect new exoplanets using high-contrast imaging (HCI). Two scientific goals are of central interest: first, to study the characteristics of the underlying planet population and distinguish between different planet formation and evolution theories; second, to find and characterize planets in our immediate Solar neighborhood. Both goals rely heavily on the metric used to quantify planet detections and non-detections. Current standards often rely on several explicit or implicit assumptions about the noise. For example, it is often assumed that the residual noise after data post-processing is Gaussian. While being an inseparable part of the metric, these assumptions are rarely verified. This is problematic, as any violation of these assumptions can lead to systematic biases, making it hard, if not impossible, to compare results across datasets or instruments with different noise characteristics. We revisit the fundamental question of how to quantify detection limits in HCI and focus our analysis on the error budget resulting from violated assumptions. To this end, we propose a new metric based on bootstrapping that generalizes current standards to non-Gaussian noise. We apply our method to archival HCI data from the NACO-VLT instrument and derive detection limits for different types of noise. Our analysis shows that current standards tend to give detection limits that are about one magnitude too optimistic in the speckle-dominated regime. That is, HCI surveys may have excluded planets that can still exist. Comment: After a first iteration with the referee, resubmitted to AJ. Comments welcome.
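
    The central idea is to replace the Gaussian assumption behind the usual 5-sigma threshold with a threshold derived by resampling the measured noise itself. A simplified sketch of such a bootstrap threshold (an illustration of the general approach, not the paper's estimator; the test statistic and the target false-positive fraction are placeholders):

```python
import numpy as np

def bootstrap_threshold(noise_samples, fpf=1e-3, n_boot=100_000, seed=0):
    """Detection threshold (in units of the empirical noise scatter)
    derived from bootstrapped noise aperture fluxes.

    noise_samples : aperture fluxes at planet-free positions on one
                    separation annulus of the post-processed image
    fpf           : target false-positive fraction; reaching 5-sigma
                    Gaussian-equivalent values (~2.9e-7) needs far more
                    resamples or a parametric fit to the bootstrap tail
    """
    rng = np.random.default_rng(seed)
    n = len(noise_samples)

    # Resample the noise with replacement and record a simple
    # standardized test statistic for one candidate aperture.
    stats = np.empty(n_boot)
    for i in range(n_boot):
        sample = rng.choice(noise_samples, size=n, replace=True)
        stats[i] = (sample[0] - sample[1:].mean()) / sample[1:].std(ddof=1)

    # The threshold is the (1 - fpf) quantile of the bootstrapped
    # statistic rather than the value implied by a Gaussian model.
    return np.quantile(stats, 1.0 - fpf)
```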

    CROCODILE: Incorporating medium-resolution spectroscopy of close-in directly imaged exoplanets into atmospheric retrievals via cross-correlation

    The investigation of the atmospheres of closely separated, directly imaged gas giant exoplanets is challenging due to the presence of stellar speckles that pollute their spectrum. To remedy this, the analysis of medium- to high-resolution spectroscopic data via cross-correlation with spectral templates (cross-correlation spectroscopy) is emerging as a leading technique. We aim to define a robust Bayesian framework combining, for the first time, three widespread direct-imaging techniques, namely photometry, low-resolution spectroscopy, and medium-resolution cross-correlation spectroscopy, in order to derive the atmospheric properties of close-in directly imaged exoplanets. Our framework CROCODILE (cross-correlation retrievals of directly imaged self-luminous exoplanets) naturally combines the three techniques by adopting adequate likelihood functions. To validate our routine, we simulated observations of gas giants similar to the well-studied β Pictoris b planet and we explored the parameter space of their atmospheres to search for potential biases. We obtain more accurate measurements of atmospheric properties when combining photometry, low- and medium-resolution spectroscopy into atmospheric retrievals than when using the techniques separately, as is usually done in the literature. We find that medium-resolution (R ≈ 4000) K-band cross-correlation spectroscopy alone is not suitable to constrain the atmospheric properties of our synthetic datasets; however, this problem disappears when simultaneously fitting photometry and low-resolution (R ≈ 60) spectroscopy between the Y and M bands. Our framework allows the atmospheric characterisation of directly imaged exoplanets using the high-quality spectral data that will be provided by the new generation of instruments such as VLT/ERIS, JWST/MIRI, and ELT/METIS.
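
    The core of such a framework is a joint log-likelihood in which each data type contributes its own term: Gaussian (chi-square) terms for photometry and low-resolution spectroscopy, plus a cross-correlation term for the medium-resolution data. A hedged sketch of what this combination could look like (the cross-correlation term below follows the widely used Zucker (2003) mapping from correlation coefficient to log-likelihood; the actual CROCODILE likelihood may differ in detail):

```python
import numpy as np

def chi2_logl(data, model, sigma):
    """Gaussian log-likelihood term for photometry or a low-res spectrum."""
    return -0.5 * np.sum(((data - model) / sigma) ** 2)

def crosscorr_logl(data, template):
    """Cross-correlation log-likelihood for a continuum-subtracted
    medium-resolution spectrum: ln L = -(N/2) ln(1 - C^2)."""
    d = data - data.mean()
    t = template - template.mean()
    c = np.sum(d * t) / np.sqrt(np.sum(d ** 2) * np.sum(t ** 2))
    return -0.5 * d.size * np.log(1.0 - c ** 2)

def joint_logl(phot, phot_model, phot_err,
               lowres, lowres_model, lowres_err,
               medres, medres_model):
    """Joint log-likelihood over the three data types; the model arrays
    would come from a radiative-transfer forward model evaluated at the
    proposed atmospheric parameters inside the retrieval."""
    return (chi2_logl(phot, phot_model, phot_err)
            + chi2_logl(lowres, lowres_model, lowres_err)
            + crosscorr_logl(medres, medres_model))
```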

    ISPY-NACO Imaging Survey for Planets around Young stars. The demographics of forming planets embedded in protoplanetary disks

    We present the statistical analysis of a subsample of 45 young stars surrounded by protoplanetary disks (PPDs). This is the largest imaging survey uniquely focused on PPDs to date. Our goal is to search for young forming companions embedded in the disk material and to constrain their occurrence rate in relation to the formation mechanism. We used principal component analysis-based point spread function subtraction techniques to reveal young companions forming in the disks. We calculated detection limits for our datasets and adopted a black-body model to derive temperature upper limits of potential forming planets. We then used Monte Carlo simulations to constrain the population of forming gas giant companions and compared our results to different types of formation scenarios. Our data revealed a new binary system (HD 38120) and a recently identified triple system with a brown dwarf companion orbiting a binary system (HD 101412), in addition to 12 known companions. Furthermore, we detected signals from 17 disks, two of which (HD 72106 and T CrA) were imaged for the first time. We reached median detection limits of L' = 15.4 mag at 2.0 arcsec, which were used to investigate the temperature of potentially embedded forming companions. We can constrain the occurrence of forming planets with semi-major axes a in [20, 500] au and Teff in [600, 3000] K, in line with the statistical results obtained for more evolved systems from other direct imaging surveys. The NaCo-ISPY data confirm that massive bright planets accreting at high rates are rare. More powerful instruments with better sensitivity in the near- to mid-infrared are likely required to unveil the wealth of forming planets sculpting the observed disk substructures. Comment: 25 pages, 16 figures, 3 tables, accepted for publication in A&A
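
    The Monte Carlo step essentially turns the per-star detection limits into a survey-wide detection probability for an assumed companion population, which can then be compared with the number of actual detections. A simplified sketch of that logic (an illustration only; the population priors, the contrast-to-temperature conversion, and the grids used in the NaCo-ISPY analysis are more involved):

```python
import numpy as np

def survey_completeness(sep_grid_arcsec, contrast_limits, distances_pc,
                        n_draws=10_000, seed=1):
    """Fraction of randomly drawn companions that would be detectable.

    sep_grid_arcsec : (n_sep,) increasing angular separations of the limits
    contrast_limits : (n_stars, n_sep) contrast limits in magnitudes
    distances_pc    : (n_stars,) stellar distances in parsec
    """
    rng = np.random.default_rng(seed)
    n_stars = len(distances_pc)

    # Toy population: log-uniform semi-major axes in [20, 500] au and
    # uniform companion contrasts in [6, 16] mag (placeholder priors).
    a_au = 10 ** rng.uniform(np.log10(20), np.log10(500),
                             size=(n_stars, n_draws))
    dmag = rng.uniform(6, 16, size=(n_stars, n_draws))

    detected = np.zeros((n_stars, n_draws), dtype=bool)
    for i in range(n_stars):
        # Face-on, circular-orbit toy projection: separation in arcsec
        sep_arcsec = a_au[i] / distances_pc[i]
        limit = np.interp(sep_arcsec, sep_grid_arcsec, contrast_limits[i])
        detected[i] = dmag[i] < limit        # brighter than the local limit

    # Per-star and survey-averaged detection probabilities
    return detected.mean(axis=1), detected.mean()
```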

    Single Femtosecond Laser-Pulse-Induced Superficial Amorphization and Re-Crystallization of Silicon

    21 pages, 9 figures, 1 table. This article belongs to the Special Issue Advanced Pulse Laser Machining Technology. Superficial amorphization and re-crystallization of silicon in different crystal orientations after irradiation by femtosecond laser pulses (790 nm, 30 fs) are studied using optical imaging and transmission electron microscopy. Spectroscopic imaging ellipsometry (SIE) allows fast data acquisition at multiple wavelengths and provides experimental data for calculating nanometric amorphous-layer thickness profiles with micrometric lateral resolution based on a thin-film layer model. For a radially Gaussian laser beam and at moderate peak fluences above the melting and below the ablation thresholds, laterally parabolic amorphous layer profiles with maximum thicknesses of several tens of nanometers were quantitatively obtained. The accuracy of the calculations is verified experimentally by high-resolution transmission electron microscopy (HRTEM) and energy-dispersive X-ray spectroscopy (STEM-EDX). Along with topographic information obtained by atomic force microscopy (AFM), a comprehensive picture of the superficial re-solidification of silicon after local melting by femtosecond laser pulses is drawn. C.F. acknowledges the support from the European Commission through the Marie Curie Individual Fellowship - Global grant No. 844977 and funding from the Horizon 2020 CellFreeImplant European project. D.F., M.D., S.S., A.H. and U.B. gratefully acknowledge the funding from the German Central Innovation Program (AiF-ZIM) under grants No. ZF4044219AB7 and ZF4460401AB7. K.F., M.R. and A.U. acknowledge support by the German Research Foundation (grant Nos. UN 341/3-1 and Inst 275/391-1). J.B. acknowledges the projects CellFreeImplant and LaserImplant. These two projects have received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreements No. 800832 (CellFreeImplant) and No. 951730 (LaserImplant). Peer reviewed
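
    The parabolic layer shape is what one expects from the beam profile: if the local amorphous-layer thickness scales with the logarithm of the local fluence above the modification threshold (a common assumption in pulsed-laser processing, not a statement about the exact model used in this work), a radially Gaussian fluence distribution produces a thickness profile that is quadratic in the radius. A short numerical sketch with purely illustrative parameters:

```python
import numpy as np

# Illustrative parameters: peak fluence and modification-threshold fluence
# (only their ratio matters here), 1/e^2 beam radius, and a depth scale.
F0, F_th = 0.35, 0.20          # arbitrary fluence units
w0, d0 = 15e-6, 20e-9          # metres

r = np.linspace(0.0, w0, 200)
F = F0 * np.exp(-2.0 * r ** 2 / w0 ** 2)      # radially Gaussian fluence

# d(r) ~ d0 * ln(F(r)/F_th) = d0 * [ln(F0/F_th) - 2 r^2 / w0^2]:
# logarithmic depth scaling turns the Gaussian beam into a parabolic
# thickness profile, zero outside the radius where F drops below F_th.
d = np.where(F > F_th, d0 * np.log(F / F_th), 0.0)
```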

    Exoplanet imaging data challenge: benchmarking the various image processing methods for exoplanet detection

    The Exoplanet Imaging Data Challenge is a community-wide effort meant to offer a platform for a fair and common comparison of image processing methods designed for exoplanet direct detection. For this purpose, it gathers, on a dedicated repository (Zenodo), data from several high-contrast ground-based instruments worldwide in which we injected synthetic planetary signals. The data challenge is hosted on the CodaLab competition platform, where participants can upload their results. The specifications of the data challenge are published on our website https://exoplanet-imaging-challenge.github.io/. The first phase, launched on the 1st of September 2019 and closed on the 1st of October 2020, consisted of detecting point sources in two common types of data sets in the field of high-contrast imaging: data taken in pupil-tracking mode at one wavelength (subchallenge 1, also referred to as ADI) and multispectral data taken in pupil-tracking mode (subchallenge 2, also referred to as ADI+mSDI). In this paper, we describe the approach, organisational lessons learnt and current limitations of the data challenge, as well as preliminary results of the participants' submissions for this first phase. In the future, we plan to provide permanent access to the standard library of data sets and metrics, in order to guide the validation and support the publications of innovative image processing algorithms dedicated to high-contrast imaging of planetary systems.

    Exoplanet imaging data challenge, phase II: characterization of exoplanet signals in high-contrast images

    Peer reviewed. Today, there exists a wide variety of algorithms dedicated to high-contrast imaging, especially for the detection and characterisation of exoplanet signals. These algorithms are tailored to address the very high contrast between the exoplanet signal(s) and the bright starlight residuals in coronagraphic images, with planet signals that can be more than two orders of magnitude fainter than those residuals. The starlight residuals are inhomogeneously distributed and follow various timescales that depend on the observing conditions and on the target star brightness. Disentangling the exoplanet signals from the starlight residuals is therefore challenging, and new post-processing algorithms are striving to achieve more accurate astrophysical results. The Exoplanet Imaging Data Challenge is a community-wide effort to develop, compare and evaluate algorithms using a set of benchmark high-contrast imaging datasets. After a first phase, run in 2020 and focused on the detection capabilities of existing algorithms, this ongoing second phase compares the characterisation capabilities of state-of-the-art techniques. The characterisation of planetary companions is two-fold: the astrometry (estimated position with respect to the host star) and the spectrophotometry (estimated contrast with respect to the host star, as a function of wavelength). The goal of this second phase is to offer a platform for the community to benchmark techniques in a fair, homogeneous and robust way, and to foster collaborations.
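
    For the astrometry and spectrophotometry part, one widely used strategy in high-contrast imaging (an example of the kind of technique being benchmarked, not necessarily what any given participant submitted) is to inject a negative copy of the instrumental PSF into the frames and minimize the residuals left after post-processing. A compact sketch with a simple median-ADI reduction standing in for the actual algorithms:

```python
import numpy as np
from scipy.ndimage import rotate, shift
from scipy.optimize import minimize

def residual_cost(params, cube, angles, psf, aperture_mask):
    """Residual variance after injecting a negative companion at
    (dx, dy) pixels from the star with the given flux scaling.

    cube : (n_frames, ny, nx) star-centred frames; psf : (ny, nx) centred
    off-axis PSF template; aperture_mask : boolean mask around the
    candidate position in the final, derotated image.
    """
    dx, dy, flux = params
    model = flux * shift(psf, (dy, dx), order=1)

    # Place the negative companion in each frame, following the field
    # rotation (the sign convention here is purely illustrative).
    injected = np.empty_like(cube)
    for i, ang in enumerate(angles):
        injected[i] = cube[i] - rotate(model, ang, reshape=False, order=1)

    # Simple median-ADI reduction: subtract the median PSF, derotate, combine.
    residual_cube = injected - np.median(injected, axis=0)
    derotated = [rotate(f, -a, reshape=False, order=1)
                 for f, a in zip(residual_cube, angles)]
    final = np.median(derotated, axis=0)
    return np.var(final[aperture_mask])

# Hypothetical usage, with cube, angles, psf and aperture_mask assumed given:
# best = minimize(residual_cost, x0=[10.0, -5.0, 1e-4],
#                 args=(cube, angles, psf, aperture_mask), method="Nelder-Mead")
```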

    Integrated photonic-based coronagraphic systems for future space telescopes

    The detection and characterization of Earth-like exoplanets around Sun-like stars is a primary science motivation for the Habitable Worlds Observatory. However, the current best technology is not yet advanced enough to reach the 10^-10 contrasts at close angular separations and at the same time remain insensitive to low-order aberrations, as would be required to achieve high-contrast imaging of exo-Earths. Photonic technologies could fill this gap, potentially doubling exo-Earth yield. We review current work on photonic coronagraphs and investigate the potential of hybridized designs which combine both classical coronagraph designs and photonic technologies into a single optical system. We present two possible systems. First, a hybrid solution which splits the field of view spatially such that the photonics handle light within the inner working angle and a conventional coronagraph suppresses starlight outside it. Second, a hybrid solution where the conventional coronagraph and photonics operate in series, complementing each other and thereby loosening requirements on each subsystem. As photonic technologies continue to advance, a hybrid or fully photonic coronagraph holds great potential for future exoplanet imaging from space. Comment: Conference Proceedings of SPIE: Techniques and Instrumentation for Detection of Exoplanets XI, vol. 12680 (2023)

    Visible extreme adaptive optics on extremely large telescopes: Towards detecting oxygen in Proxima Centauri b and analogs

    Looking to the future of exo-Earth imaging from the ground, core technology developments are required in visible extreme adaptive optics (ExAO) to enable the observation of atmospheric features such as oxygen on rocky planets in visible light. UNDERGROUND (Ultra-fast AO techNology Determination for Exoplanet imageRs from the GROUND), a collaboration formed in Feb. 2023 at the Optimal Exoplanet Imagers Lorentz Workshop, aims to (1) motivate oxygen detection in Proxima Centauri b and analogs as an informative science case for high-contrast imaging and direct spectroscopy, (2) overview the state of the field with respect to visible exoplanet imagers, and (3) set the instrumental requirements to achieve this goal and identify which key technologies require further development. Comment: SPIE Proceeding: 2023 / 12680-6